Stochastic subgradient method converges at the rate O(k^{-1/4}) on weakly convex functions

Authors

  • Damek Davis
  • Dmitriy Drusvyatskiy
Abstract

We prove that the projected stochastic subgradient method, applied to a weakly convex problem, drives the gradient of the Moreau envelope to zero at the rate O(k^{-1/4}).
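
For intuition, here is a minimal sketch of the iteration the abstract refers to: the projected stochastic subgradient step x_{k+1} = proj_X(x_k - α_k g_k), with progress measured through the gradient of the Moreau envelope. The objective, noise model, step sizes, and constraint set below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def projected_stochastic_subgradient(x0, subgrad_oracle, proj, alpha, K):
    """Projected stochastic subgradient method (illustrative sketch).

    Iterates x_{k+1} = proj(x_k - alpha(k) * g_k), where g_k is a
    stochastic subgradient at x_k.  For weakly convex objectives, the
    paper shows this drives the norm of the gradient of the Moreau
    envelope to zero at the rate O(k^{-1/4})."""
    x = np.asarray(x0, dtype=float)
    for k in range(K):
        g = subgrad_oracle(x)           # noisy subgradient sample
        x = proj(x - alpha(k) * g)      # projected step
    return x

# Illustrative weakly convex instance (an assumption, not the paper's):
# f(x) = |x[0]**2 - 1| + ||x||_1, minimized over the box [-2, 2]^2.
rng = np.random.default_rng(0)

def subgrad_oracle(x, sigma=0.1):
    g = np.sign(x).astype(float)                  # subgradient of ||x||_1
    g[0] += 2.0 * x[0] * np.sign(x[0]**2 - 1.0)   # chain rule through |.|
    return g + sigma * rng.standard_normal(x.shape)

proj = lambda x: np.clip(x, -2.0, 2.0)            # projection onto the box
alpha = lambda k: 0.1 / np.sqrt(k + 1)            # diminishing step size

x_final = projected_stochastic_subgradient(
    np.array([1.5, -0.5]), subgrad_oracle, proj, alpha, K=10_000)
```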

Related articles

Complexity of finding near-stationary points of convex functions stochastically

In the recent paper [3], it was shown that the stochastic subgradient method, applied to a weakly convex problem, drives the gradient of the Moreau envelope to zero at the rate O(k^{-1/4}). In this supplementary note, we present a stochastic subgradient method for minimizing a convex function, with the improved rate Õ(k^{-1/2}).

Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems

In this paper, we introduce a stochastic projected subgradient method for weakly convex (i.e., uniformly prox-regular) nonsmooth, nonconvex functions, a wide class which includes the additive and convex composite classes. At a high level, the method is an inexact proximal point iteration in which the strongly convex proximal subproblems are quickly solved with a specialized stochast...
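
To make the high-level description concrete, a hedged sketch of such a double loop follows: each outer step approximately solves the strongly convex subproblem min_y f(y) + (ρ/2)||y - x||^2 with a short run of the stochastic subgradient method. The inner step-size schedule and weighted averaging below are standard choices for strongly convex problems, assumed for illustration rather than copied from the paper.

```python
import numpy as np

def proximally_guided_subgradient(x0, subgrad_oracle, rho, outer_iters,
                                  inner_iters):
    """Inexact proximal point sketch: each outer step approximately solves
    min_y f(y) + (rho/2)||y - x||^2 (strongly convex once rho exceeds the
    weak-convexity constant) with a short stochastic subgradient run."""
    x = np.asarray(x0, dtype=float)
    T = inner_iters
    for _ in range(outer_iters):
        y, y_avg = x.copy(), np.zeros_like(x)
        for k in range(T):
            g = subgrad_oracle(y) + rho * (y - x)   # subgradient of the model
            y = y - (2.0 / (rho * (k + 2))) * g     # strongly convex step size
            y_avg += 2.0 * (k + 1) / (T * (T + 1)) * y   # weighted average
        x = y_avg                                   # accept the inexact prox
    return x
```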

Stochastic model-based minimization of weakly convex functions

We consider an algorithm that successively samples and minimizes stochastic models of the objective function. We show that under weak-convexity and Lipschitz conditions, the algorithm drives the expected norm of the gradient of the Moreau envelope to zero at the rate O(k^{-1/4}). Our result yields the first complexity guarantees for the stochastic proximal point algorithm on weakly convex problems...
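
One instance of this template is the stochastic proximal point update x_{k+1} = argmin_y f(y; ξ_k) + (β_k/2)||y - x_k||^2. As an assumed illustration (not the paper's setup), the sampled loss f(y; (a, b)) = |a·y - b| from robust linear regression admits the closed-form step below.

```python
import numpy as np

def stochastic_proximal_point(x0, sample, beta, K):
    """Stochastic model-based minimization, proximal point variant:
    x_{k+1} = argmin_y f(y; xi_k) + (beta(k)/2) * ||y - x_k||^2.
    For the sampled loss f(y; (a, b)) = |a @ y - b| the step has the
    closed form used below (a clipped correction along a)."""
    x = np.asarray(x0, dtype=float)
    for k in range(K):
        a, b = sample()                   # draw one stochastic model
        t = 1.0 / beta(k)                 # proximal parameter
        r = a @ x - b                     # current residual
        na2 = a @ a
        x = x - a * np.clip(r, -t * na2, t * na2) / na2
    return x

# Assumed data model for illustration: noisy linear measurements.
rng = np.random.default_rng(0)
x_true = rng.standard_normal(5)

def sample():
    a = rng.standard_normal(5)
    return a, a @ x_true + 0.01 * rng.standard_normal()

x_est = stochastic_proximal_point(np.zeros(5), sample,
                                  beta=lambda k: np.sqrt(k + 1.0), K=5000)
```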

On Stochastic Subgradient Mirror-Descent Algorithm with Weighted Averaging

This paper considers a stochastic subgradient mirror-descent method for solving constrained convex minimization problems. In particular, a stochastic subgradient mirror-descent method with weighted iterate-averaging is investigated and its per-iterate convergence rate is analyzed. The novel part of the approach is in the choice of weights that are used to construct the averages. Through the use o...
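
As an illustration of the setup, here is a sketch of mirror descent with weighted iterate-averaging under assumed details: the negative-entropy mirror map over the probability simplex, and weights proportional to the step sizes, which is one common choice rather than necessarily the paper's.

```python
import numpy as np

def mirror_descent_weighted_avg(x0, subgrad_oracle, alpha, K):
    """Stochastic subgradient mirror descent with weighted averaging.
    Mirror map: negative entropy over the probability simplex, giving
    the multiplicative update x_{k+1} ∝ x_k * exp(-alpha(k) * g_k).
    The averaged point uses weights w_k = alpha(k), one standard choice
    among the weighting schemes such papers analyze."""
    x = np.asarray(x0, dtype=float)       # x0 must lie in the simplex
    x_avg = np.zeros_like(x)
    w_sum = 0.0
    for k in range(K):
        g = subgrad_oracle(x)             # stochastic subgradient
        x = x * np.exp(-alpha(k) * g)     # entropic mirror step
        x = x / x.sum()                   # Bregman projection onto simplex
        w_sum += alpha(k)
        x_avg += alpha(k) * x             # accumulate weighted iterates
    return x_avg / w_sum

# Example start: the uniform distribution over n coordinates (assumed):
# x_bar = mirror_descent_weighted_avg(np.full(50, 1 / 50), oracle,
#                                     lambda k: 0.1 / np.sqrt(k + 1), 1000)
```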

Convergence Rates for Deterministic and Stochastic Subgradient Methods Without Lipschitz Continuity

We extend the classic convergence rate theory for subgradient methods to apply to non-Lipschitz functions. For the deterministic projected subgradient method, we present a global O(1/√T) convergence rate for any convex function which is locally Lipschitz around its minimizers. This approach is based on Shor's classic subgradient analysis and implies generalizations of the standard convergenc...


Journal:
  • CoRR

Volume: abs/1802.02988

Publication date: 2018